88 research outputs found

    The INFN Open Access Repository Conceptual Design Report

    Get PDF
    This document presents the conceptual design of a long-term institutional Open Access Repository for INFN. The motivations and objectives are discussed, together with the state-of-the-art technologies available and some existing examples. Both the current implementation and the proposed long-term solution are presented, along with the human and financial resources needed.

    Creating and Managing Dynamic Cloud Federations

    Get PDF
    Cloud computing has evolved from a promising approach to service provisioning into the reference model for new data centres. In addition, an increasing number of companies are migrating their business to the cloud "ecosystem", adopting the solutions developed by the biggest public Cloud Service Providers (CSPs). Smaller CSPs build their infrastructures on the technologies available to them; to better support user activities and provide enough resources to their users, federation can be a possible solution. In this work we present different federation models, discussing their strengths and weaknesses together with our considerations. Besides the existing federations highlighted, we show the design of a new implementation under development at INFN, aimed at maximising the scalability and flexibility of small and/or hybrid clouds through the introduction of a federation manager. This new component will support seamless resource renting based on the acceptance of federation agreements among operators. Additionally, we discuss how the implementation of this model inside research institutes could help in the field of High Energy Physics, with explicit reference to the LHC experiments, digital humanities, life sciences and other domains.
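    The federation manager mentioned above is described only at the design level in this abstract. The short Python sketch below illustrates one way such a component might match a resource request against federation agreements among providers; all class and field names (FederationAgreement, ResourceRequest, FederationManager and their capacity fields) are hypothetical and are not taken from the INFN implementation.

        from dataclasses import dataclass
        from typing import List, Optional

        @dataclass
        class FederationAgreement:
            # Hypothetical agreement between the home cloud and a partner provider.
            provider: str
            max_vcpus: int      # capacity the partner agrees to rent out
            max_ram_gb: int
            accepted: bool      # both operators have accepted the agreement

        @dataclass
        class ResourceRequest:
            vcpus: int
            ram_gb: int

        class FederationManager:
            """Illustrative broker that rents resources from federated providers."""

            def __init__(self, agreements: List[FederationAgreement]):
                self.agreements = agreements

            def place(self, request: ResourceRequest) -> Optional[str]:
                # Only agreements accepted by both operators are eligible.
                for a in self.agreements:
                    if a.accepted and a.max_vcpus >= request.vcpus and a.max_ram_gb >= request.ram_gb:
                        a.max_vcpus -= request.vcpus
                        a.max_ram_gb -= request.ram_gb
                        return a.provider
                return None  # no federated capacity available

        # Example: burst a 16-core request to a partner site when the local cloud is full.
        manager = FederationManager([FederationAgreement("partner-cloud", 64, 256, True)])
        print(manager.place(ResourceRequest(vcpus=16, ram_gb=64)))  # -> "partner-cloud"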

    e-Infrastructures for e-Science: A Global View

    Get PDF
    In the last 10 years, a new way of doing science has been spreading across the world thanks to the development of virtual research communities spanning many geographic and administrative boundaries. A virtual research community is a widely dispersed group of researchers, together with the associated scientific instruments, working in a common virtual environment. This new kind of scientific environment, usually called a "collaboratory", is based on the availability of high-speed networks and broadband access, advanced virtual tools and Grid middleware technologies which, altogether, are the elements of e-Infrastructures. The European Commission has invested heavily in promoting this new way of collaboration among scientists, funding several international projects aimed at creating e-Infrastructures to enable the European Research Area and connect European researchers with their colleagues based in Africa, Asia and Latin America. In this paper we describe the current status of these e-Infrastructures and present a complete picture of the virtual research communities using them. Information on the scientific domains and on the applications supported is provided, together with their geographic distribution.

    INDIGO-DataCloud: a Platform to Facilitate Seamless Access to E-Infrastructures

    Get PDF
    This paper describes the achievements of the H2020 project INDIGO-DataCloud. The project has provided e-infrastructures with tools, applications and cloud framework enhancements to manage the demanding requirements of scientific communities, either locally or through enhanced interfaces. The middleware developed allows hybrid resources to be federated and scientific applications to be easily written, ported and run on the cloud. In particular, we have extended existing PaaS (Platform as a Service) solutions, allowing public and private e-infrastructures, including those provided by EGI, EUDAT and Helix Nebula, to integrate their existing services and make them available through AAI services compliant with GEANT interfederation policies, thus guaranteeing transparency and trust in the provisioning of such services. Our middleware facilitates the execution of applications using containers on Cloud and Grid based infrastructures, as well as on HPC clusters. Our developments are freely downloadable as open source components and are already being integrated into many scientific applications. INDIGO-DataCloud has been funded by the European Commission H2020 research and innovation programme under grant agreement RIA 653549. Published as: Salomoni, D., Campos, I., Gaido, L., Marco, J., Solagna, P., Gomes, J., Matyska, L., et al.: INDIGO-DataCloud: a Platform to Facilitate Seamless Access to E-Infrastructures. Journal of Grid Computing 16(3), 381-408 (2018). https://doi.org/10.1007/s10723-018-9453-3
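    As a concrete illustration of running containerised applications without administrative privileges (the scenario addressed by udocker, one of the tools produced within INDIGO-DataCloud), the following Python sketch drives udocker through subprocess calls. It assumes udocker is installed and on PATH; the image name and payload are placeholders, and the invocation shown is typical usage rather than a prescription from the paper.

        import subprocess

        def run(cmd):
            """Run a command, echo it, and fail loudly if it returns non-zero."""
            print("+", " ".join(cmd))
            subprocess.run(cmd, check=True)

        # Pull an image, create a container from it, and execute a payload,
        # all in user space (no root privileges, no Docker daemon required).
        image = "docker.io/library/python:3.11-slim"   # placeholder image
        run(["udocker", "pull", image])
        run(["udocker", "create", "--name=indigo_demo", image])
        run(["udocker", "run", "indigo_demo",
             "python3", "-c", "print('hello from a user-space container')"])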

    Radioactivity control strategy for the JUNO detector

    Get PDF
    JUNO is a massive liquid scintillator detector with the primary scientific goal of determining the neutrino mass ordering by studying the oscillated anti-neutrino flux coming from two nuclear power plants at a distance of 53 km. The expected signal anti-neutrino interaction rate is only 60 counts per day (cpd); therefore a careful control of the background sources due to radioactivity is critical. In particular, natural radioactivity present in all materials and in the environment represents a serious issue that could impair the sensitivity of the experiment if appropriate countermeasures were not foreseen. In this paper we discuss the background reduction strategies undertaken by the JUNO collaboration to minimise the impact of natural radioactivity. We describe our efforts towards an optimized experimental design, a careful material screening and accurate detector production handling, and a constant control of the expected results through a meticulous Monte Carlo simulation program. We show that all these actions should allow us to keep the background count rate safely below the target value of 10 Hz (i.e. ~1 cpd of accidental background) in the default fiducial volume, above an energy threshold of 0.7 MeV.
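    To make the relation between the 10 Hz singles target and the ~1 cpd accidental background quoted above concrete, the short Python estimate below folds the singles rate into a prompt-delayed coincidence. The coincidence window and the suppression factor from vertex-distance and delayed-energy cuts are illustrative assumed values, not numbers taken from the paper.

        # Back-of-the-envelope accidental-coincidence estimate (illustrative values only).
        singles_rate_hz = 10.0          # target singles rate in the fiducial volume (from the abstract)
        coincidence_window_s = 1.0e-3   # assumed prompt-delayed time window
        cut_suppression = 1.2e-4        # assumed combined vertex-distance / delayed-energy suppression

        pairs_per_s = singles_rate_hz ** 2 * coincidence_window_s     # random prompt-delayed pairs
        accidental_cpd = pairs_per_s * cut_suppression * 86400.0      # surviving pairs per day

        print(f"accidental background ~ {accidental_cpd:.1f} counts per day")   # ~ 1 cpd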

    Potential of Core-Collapse Supernova Neutrino Detection at JUNO

    Get PDF
    JUNO is an underground neutrino observatory under construction in Jiangmen, China. It uses 20 kton of liquid scintillator as target, which enables it to detect supernova burst neutrinos with large statistics for the next galactic core-collapse supernova (CCSN), as well as pre-supernova neutrinos from nearby CCSN progenitors. All flavours of supernova burst neutrinos can be detected by JUNO via several interaction channels, including inverse beta decay, elastic scattering on electrons and protons, and interactions on 12C nuclei. This gives JUNO the possibility to reconstruct the energy spectra of supernova burst neutrinos of all flavours. Real-time monitoring systems based on FPGA and on the DAQ are under development in JUNO, allowing a prompt alert and trigger-less data acquisition of CCSN events. The alert performance of both monitoring systems has been thoroughly studied using simulations. Moreover, once a CCSN is tagged, the system can provide fast characterisations, such as directionality and the light curve.
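    As an illustration of the statistics such a detector can collect, the following back-of-the-envelope Python estimate counts the inverse-beta-decay (IBD) interactions expected from a galactic CCSN. All input values (distance, total emitted energy, mean neutrino energy, cross-section scaling, number of free protons in 20 kton of scintillator) are assumed typical numbers, not values from this paper.

        import math

        # Assumed typical parameters for a galactic core-collapse supernova.
        distance_cm = 10.0 * 3.086e21          # 10 kpc in cm
        E_total_erg = 5.0e52                   # energy emitted in anti-nu_e (one flavour)
        mean_E_MeV  = 14.0                     # mean anti-nu_e energy
        MeV_to_erg  = 1.602e-6

        n_nu    = E_total_erg / (mean_E_MeV * MeV_to_erg)      # anti-nu_e emitted
        fluence = n_nu / (4.0 * math.pi * distance_cm ** 2)    # anti-nu_e per cm^2 at Earth

        # Crude IBD cross section, sigma ~ 9.5e-44 cm^2 * (E/MeV)^2, evaluated at the
        # mean energy (spectral weighting and threshold effects are ignored).
        sigma_cm2 = 9.5e-44 * mean_E_MeV ** 2

        n_protons = 1.45e33                    # free protons in ~20 kton of LAB scintillator

        expected_ibd = fluence * sigma_cm2 * n_protons
        print(f"expected IBD events: ~{expected_ibd:.0f}")     # of order a few thousand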

    Detection of the Diffuse Supernova Neutrino Background with JUNO

    Get PDF
    As an underground multi-purpose neutrino detector with 20 kton of liquid scintillator, the Jiangmen Underground Neutrino Observatory (JUNO) is competitive with and complementary to the water-Cherenkov detectors in the search for the diffuse supernova neutrino background (DSNB). Typical supernova models predict 2-4 events per year within the optimal observation window in the JUNO detector. The dominant background is the neutral-current (NC) interaction of atmospheric neutrinos with 12C nuclei, which exceeds the DSNB by more than one order of magnitude. We evaluated the systematic uncertainty of the NC background from the spread of a variety of data-driven models and further developed a method to determine the NC background to within 15% with in situ measurements after ten years of running. In addition, NC-like backgrounds can be effectively suppressed by the intrinsic pulse-shape discrimination (PSD) capabilities of liquid scintillators. In this talk, I present in detail the improvements on the NC background uncertainty evaluation, the PSD discriminator development and, finally, the potential DSNB sensitivity of JUNO.
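    Pulse-shape discrimination in liquid scintillators commonly exploits the larger slow-decay light component of proton or alpha recoils relative to electron-like events. The sketch below implements a simple tail-to-total charge ratio on a digitised waveform; it is a generic illustration of the PSD idea, not the discriminator developed for JUNO, and the window boundaries and cut value are arbitrary assumed values.

        import numpy as np

        def tail_to_total(waveform: np.ndarray, peak_index: int,
                          tail_start_ns: int = 50, window_ns: int = 300,
                          ns_per_sample: float = 1.0) -> float:
            """Fraction of the pulse charge arriving in the late 'tail' of the pulse.

            Electron-like events concentrate light at early times, while proton/alpha
            recoils have a larger slow component, giving a larger tail fraction.
            """
            start = peak_index
            tail = peak_index + int(tail_start_ns / ns_per_sample)
            stop = peak_index + int(window_ns / ns_per_sample)
            total = waveform[start:stop].sum()
            return float(waveform[tail:stop].sum() / total) if total > 0 else 0.0

        # Toy usage: flag an event as NC-like if its tail fraction exceeds an assumed cut.
        rng = np.random.default_rng(0)
        toy_waveform = np.exp(-np.arange(400) / 30.0) + 0.01 * rng.random(400)  # fake fast pulse
        is_nc_like = tail_to_total(toy_waveform, peak_index=0) > 0.25           # assumed cut value
        print(is_nc_like)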

    Real-time Monitoring for the Next Core-Collapse Supernova in JUNO

    Full text link
    Core-collapse supernovae (CCSNe) are among the most energetic astrophysical events in the Universe. The early and prompt detection of neutrinos before (pre-SN) and during the SN burst is a unique opportunity to realize multi-messenger observation of CCSN events. In this work, we describe the monitoring concept and present the sensitivity of the system to pre-SN and SN neutrinos at the Jiangmen Underground Neutrino Observatory (JUNO), a 20 kton liquid scintillator detector under construction in South China. The real-time monitoring system is designed with both prompt monitors on the electronic boards and online monitors at the data acquisition stage, in order to ensure both the alert speed and the alert coverage of progenitor stars. Assuming a false alert rate of 1 per year, this monitoring system is sensitive to pre-SN neutrinos up to a distance of about 1.6 (0.9) kpc and to SN neutrinos up to about 370 (360) kpc for a progenitor mass of 30 M⊙, in the case of normal (inverted) mass ordering. The pointing ability for the CCSN is evaluated using the accumulated event anisotropy of the inverse beta decay interactions from pre-SN or SN neutrinos, which, along with the early alert, can play an important role in the follow-up multi-messenger observations of the next Galactic or nearby extragalactic CCSN.
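    The trade-off between alert speed and the quoted false alert rate of 1 per year can be illustrated with a simple counting trigger: choose the smallest number of candidate events in a sliding time window such that background fluctuations alone exceed it less than once per year. The Python sketch below computes that threshold for an assumed background rate and window length; both numbers are illustrative and the independent-window approximation is a simplification, so this is not the alert logic of the JUNO system.

        from math import exp, factorial

        def poisson_sf(k: int, mu: float) -> float:
            """P(N >= k) for a Poisson variable with mean mu."""
            return 1.0 - sum(mu**i * exp(-mu) / factorial(i) for i in range(k))

        def alert_threshold(bkg_rate_hz: float, window_s: float, false_alerts_per_year: float) -> int:
            """Smallest count k in the window keeping the false alert rate below target."""
            windows_per_year = 3.15e7 / window_s
            mu = bkg_rate_hz * window_s              # expected background counts per window
            k = 1
            while poisson_sf(k, mu) * windows_per_year > false_alerts_per_year:
                k += 1
            return k

        # Illustrative numbers: 0.01 Hz of IBD-like background, 10 s window, 1 false alert/yr.
        print(alert_threshold(bkg_rate_hz=0.01, window_s=10.0, false_alerts_per_year=1.0))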

    INDIGO-DataCloud: A data and computing platform to facilitate seamless access to e-infrastructures

    Get PDF
    This paper describes the achievements of the H2020 project INDIGO-DataCloud. The project has provided e-infrastructures with tools, applications and cloud framework enhancements to manage the demanding requirements of scientific communities, either locally or through enhanced interfaces. The middleware developed allows hybrid resources to be federated and scientific applications to be easily written, ported and run on the cloud. In particular, we have extended existing PaaS (Platform as a Service) solutions, allowing public and private e-infrastructures, including those provided by EGI, EUDAT and Helix Nebula, to integrate their existing services and make them available through AAI services compliant with GEANT interfederation policies, thus guaranteeing transparency and trust in the provisioning of such services. Our middleware facilitates the execution of applications using containers on Cloud and Grid based infrastructures, as well as on HPC clusters. Our developments are freely downloadable as open source components and are already being integrated into many scientific applications.